Learning Efficient Tensor Representations with Ring Structure Networks
Authors
Abstract
Tensor train (TT) decomposition is a powerful representation for high-order tensors, which has been successfully applied to various machine learning tasks in recent years. However, since the tensor product is not commutative, permuting the data dimensions makes the solutions and TT-ranks of TT decomposition inconsistent. To alleviate this problem, we propose a permutation-symmetric network structure that employs circular multilinear products over a sequence of low-order core tensors. This network structure can be graphically interpreted as a cyclic interconnection of tensors, and we therefore call it the tensor ring (TR) representation. We develop several efficient algorithms to learn TR representations with adaptive TR-ranks by employing low-rank approximations. Furthermore, we investigate mathematical properties of the representation, which enable basic operations to be performed in a computationally efficient way using TR representations. Experimental results on synthetic signals and real-world datasets demonstrate that the proposed TR network is more expressive and consistently more informative than existing TT networks.
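The circular structure described in the abstract can be sketched in a few lines of NumPy: a TR representation stores a sequence of 3rd-order cores, and each tensor entry is the trace of a product of matrix slices, so a cyclic shift of the cores merely permutes the tensor's dimensions. The function and variable names below are illustrative, not from the paper's code; this is a minimal reconstruction sketch, not one of the paper's learning algorithms.

```python
import numpy as np

def tr_reconstruct(cores):
    """Reconstruct a full tensor from tensor-ring (TR) cores.

    Each core G_k has shape (r_k, n_k, r_{k+1}), with the last rank
    wrapping around to r_0 (the ring closure). Entry (i1, ..., id)
    equals Trace(G_1[:, i1, :] @ G_2[:, i2, :] @ ... @ G_d[:, id, :]).
    """
    full = cores[0]  # shape (r0, n0, r1)
    for core in cores[1:]:
        # contract the trailing rank index with the next core's leading one
        full = np.tensordot(full, core, axes=([-1], [0]))
    # full now has shape (r0, n0, ..., n_{d-1}, r0); close the ring by tracing
    return np.trace(full, axis1=0, axis2=-1)

# Hypothetical example: random TR cores for a 3rd-order tensor.
rng = np.random.default_rng(0)
ranks = [2, 3, 4]   # r0, r1, r2 (r3 wraps back to r0 = 2)
dims = [5, 6, 7]
cores = [rng.standard_normal((ranks[k], dims[k], ranks[(k + 1) % 3]))
         for k in range(3)]
T = tr_reconstruct(cores)
print(T.shape)  # (5, 6, 7)
```

Because the trace is invariant under cyclic permutation of a matrix product, `tr_reconstruct([cores[1], cores[2], cores[0]])` yields exactly `T` with its dimensions cyclically permuted, which illustrates the permutation symmetry the abstract claims over TT decomposition.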
Similar resources
Supervised Learning with Quantum-Inspired Tensor Networks
Tensor networks are efficient representations of high-dimensional tensors which have been very successful for physics and mathematics applications. We demonstrate how algorithms for optimizing such networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize models for classifying images. For the MNIST data set we obtain less than 1% test s...
Supervised Learning with Tensor Networks
Tensor networks are approximations of high-order tensors which are efficient to work with and have been very successful for physics and mathematics applications. We demonstrate how algorithms for optimizing tensor networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize non-linear kernel learning models. For the MNIST data set we obtain...
Irreducibility of the tensor product of Albeverio's representations of the Braid groups $B_3$ and $B_4$
We consider Albeverio's linear representations of the braid groups $B_3$ and $B_4$. We specialize the indeterminates used in defining these representations to non-zero complex numbers. We then consider the tensor products of the representations of $B_3$ and the tensor products of those of $B_4$. We then determine necessary and sufficient conditions that guarantee the irreducibility of th...
Multiplying Modular Forms
The space of elliptic modular forms of fixed weight and level can be identified with a space of intertwining operators, from a holomorphic discrete series representation of SL2(R) to a space of automorphic forms. Moreover, multiplying elliptic modular forms corresponds to a branching problem involving tensor products of holomorphic discrete series representations. In this paper, we explicitly c...
Quantum-chemical insights from deep tensor neural networks
Learning from data has led to paradigm shifts in a multitude of disciplines, including web, text and image search, speech recognition, as well as bioinformatics. Can machine learning enable similar breakthroughs in understanding quantum many-body systems? Here we develop an efficient deep learning approach that enables spatially and chemically resolved insights into quantum-mechanical observabl...
Journal:
- CoRR
Volume: abs/1705.08286
Pages: -
Publication year: 2017